feat: Add AWS Bedrock LLM Support #1173
Conversation
@georgeh0 @badmonster0 please help me review this PR
Thanks @belloibrahv. Could you share how you tested? Which example did you use, and what do the results look like?
Hi @badmonster0, thanks for the feedback. I've updated the pull request to include a new example that demonstrates how to test the AWS Bedrock integration.

**Testing Example and Workflow**

To answer your question about testing, I've added a new example. It demonstrates a real-world use case.

**How to Run the Test**

You can run the test by following the instructions in the new example.

**Expected Results**

After a successful run, you can inspect the results with:

```sql
SELECT filename, module_info->'title' AS title, module_summary FROM modules_info;
```

**Current Status**

I've fully prepared this example and confirmed that it's ready for verification. However, I'm currently blocked from running the final test myself due to an issue with my AWS account verification, which is preventing me from obtaining credentials. Since the example is now included in the PR, I hope it makes it easy for you to test the changes on your end. Thanks again for your guidance and support.
This commit adds support for AWS Bedrock for LLM parsing. The implementation follows the approach of other LLM providers and uses the `BEDROCK_API_KEY` and `BEDROCK_REGION` environment variables for authentication. This resolves issue cocoindex-io#1162.
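The environment-variable-based authentication described above can be sketched as follows. This is a minimal, hypothetical illustration only, not the actual cocoindex implementation: the function name and the default region are assumptions; only the `BEDROCK_API_KEY` and `BEDROCK_REGION` variable names come from the PR description.

```python
import os

def read_bedrock_env() -> dict:
    # Hypothetical sketch (not the actual cocoindex code): the PR description
    # says Bedrock authentication uses BEDROCK_API_KEY and BEDROCK_REGION.
    api_key = os.environ.get("BEDROCK_API_KEY")
    if not api_key:
        raise RuntimeError("BEDROCK_API_KEY environment variable is not set")
    # Assumed fallback region, for illustration only.
    region = os.environ.get("BEDROCK_REGION", "us-east-1")
    return {"api_key": api_key, "region": region}

# Demonstration with dummy values:
os.environ["BEDROCK_API_KEY"] = "dummy-key"
os.environ["BEDROCK_REGION"] = "us-west-2"
print(read_bedrock_env())  # {'api_key': 'dummy-key', 'region': 'us-west-2'}
```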
georgeh0 left a comment:
Can you also help update the doc to add this new LLM integration? Thanks!
```python
def test_llm_api_type_bedrock() -> None:
    """Test that LlmApiType.BEDROCK is available and works."""
    from cocoindex.llm import LlmApiType, LlmSpec

    # Test enum availability
    assert hasattr(LlmApiType, "BEDROCK")
    assert LlmApiType.BEDROCK.value == "Bedrock"

    # Test LlmSpec creation with Bedrock
    spec = LlmSpec(
        api_type=LlmApiType.BEDROCK, model="us.anthropic.claude-3-5-haiku-20241022-v1:0"
    )

    assert spec.api_type == LlmApiType.BEDROCK
    assert spec.model == "us.anthropic.claude-3-5-haiku-20241022-v1:0"
    assert spec.address is None
    assert spec.api_config is None
```
This test is unnecessary; it doesn't test any logic that really needs to be tested. Please remove it.
We don't need to fork the example for just another LLM API integration. We only need to add a few lines in the existing example, like this.
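As a rough sketch of the kind of small addition meant here (the function and mapping are hypothetical; only the Bedrock model id appears in this PR's test, and the other entry is an assumption for illustration):

```python
def default_model_for(api_type: str) -> str:
    # Hypothetical provider-to-model mapping for the example. Only the
    # Bedrock model id comes from this PR; the rest is illustrative.
    models = {
        "OpenAi": "gpt-4o",
        "Bedrock": "us.anthropic.claude-3-5-haiku-20241022-v1:0",
    }
    if api_type not in models:
        raise ValueError(f"Unsupported LLM API type: {api_type}")
    return models[api_type]

print(default_model_for("Bedrock"))
```

The point of the reviewer's suggestion is that supporting a new provider should only mean adding one more entry to an existing switch like this, not duplicating the whole example.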
I have implemented the changes as requested. Kindly review when you have time. Thank you!
- Update documentation to include Bedrock LLM integration.
- Remove unnecessary test for LlmApiType.BEDROCK.
- Add Bedrock to the existing manuals_llm_extraction example instead of creating a new one.
Hi @belloibrahv, the latest release note is out and we have a section for you: https://cocoindex.io/blogs/cocoindex-changelog-2025-10-19#belloibrahv. Thanks for contributing!